Online Limited-Memory BFGS for Click-Through Rate Prediction

Authors

  • Mitchell Stern
  • Aryan Mokhtari
Abstract

We study the problem of click-through rate (CTR) prediction, where the goal is to predict the probability that a user will click on a search advertisement given information about the issued query and the user's account. In this paper, we formulate a model for CTR prediction using logistic regression, then assess the performance of stochastic gradient descent (SGD) and online limited-memory BFGS (oLBFGS) for training the corresponding classifier. We demonstrate empirically that oLBFGS converges faster and requires fewer training examples than SGD to achieve comparable performance, confirming the benefit of using second-order information in stochastic optimization.
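The baseline described above is logistic regression trained with SGD. The sketch below is a minimal illustration of that setup on toy data, not the authors' implementation; the feature values, learning rate, and epoch count are all assumptions chosen for clarity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sgd_logistic(X, y, lr=0.1, epochs=50, seed=0):
    """Train a logistic regression classifier with plain SGD."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            p = sigmoid(X[i] @ w)
            w -= lr * (p - y[i]) * X[i]  # stochastic gradient of the log loss
    return w

# Toy CTR-style data: a click occurs iff the first feature is positive.
X = np.array([[1.0, 0.2], [0.8, -0.1], [-1.0, 0.3], [-0.7, -0.2]])
y = np.array([1, 1, 0, 0])
w = sgd_logistic(X, y)
preds = (sigmoid(X @ w) > 0.5).astype(int)
```

oLBFGS replaces the raw gradient step above with a step preconditioned by a limited-memory inverse-Hessian approximation, which is where its faster convergence comes from.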


Similar References

Variable Metric Stochastic Approximation Theory

We provide a variable metric stochastic approximation theory. In doing so, we provide a convergence theory for a large class of online variable metric methods including the recently introduced online versions of the BFGS algorithm and its limited-memory LBFGS variant. We also discuss the implications of our results in the areas of eliciting properties of distributions using prediction markets a...


A Stochastic Quasi-Newton Method for Online Convex Optimization

We develop stochastic variants of the well-known BFGS quasi-Newton optimization method, in both full and memory-limited (LBFGS) forms, for online optimization of convex functions. The resulting algorithm performs comparably to a well-tuned natural gradient descent but is scalable to very high-dimensional problems. On standard benchmarks in natural language processing, it asymptotically outperfor...


Variable Metric Stochastic Approximation Theory

We provide a variable metric stochastic approximation theory. In doing so, we provide a convergence theory for a large class of online variable metric methods including the recently introduced online versions of the BFGS algorithm and its limited-memory LBFGS variant. We also discuss the implications of our results for learning from expert advice.


Global convergence of online limited memory BFGS

Global convergence of an online (stochastic) limited memory version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method for solving optimization problems with stochastic objectives that arise in large scale machine learning is established. Lower and upper bounds on the Hessian eigenvalues of the sample functions are shown to suffice to guarantee that the curvature approximation ma...
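The limited-memory curvature approximation that the abstract above analyzes is conventionally applied via the L-BFGS two-loop recursion over stored curvature pairs (s, y). The following is a minimal sketch of that standard recursion, offered for illustration only; it is not taken from the cited paper.

```python
import numpy as np

def two_loop(grad, s_list, y_list):
    """L-BFGS two-loop recursion: apply the inverse-Hessian
    approximation built from curvature pairs (s, y) to grad.
    s_list and y_list are ordered oldest to newest."""
    q = grad.copy()
    alphas = []  # stored newest-first
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)  # initial Hessian scaling
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q += (a - b) * s
    return q  # approximates (inverse Hessian) @ grad
```

With an empty memory the recursion reduces to the identity, recovering plain SGD; each stored pair refines the curvature estimate.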


Criterion for the Limited Memory BFGS Algorithm for Large Scale Nonlinear Optimization

This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large scale unconstrained optimization problems. Each modification technique attempts to improve the quality of the L-BFGS Hessian by employing (extra) updates in a certain sense. Because at some iterations these updates might be redundant or worsen the quality of this Hessian, this paper proposes an upda...




Publication date: 2015